Bayesian optimization over the probability simplex

Authors

Abstract

Gaussian-Process-based Bayesian optimization is largely adopted for solving problems where the inputs are in Euclidean spaces. In this paper we associate the inputs to discrete probability distributions, which are elements of the probability simplex. To search in this new design space, we need a distance between distributions. The optimal transport distance (aka Wasserstein distance) is chosen due to its mathematical structure and the computational strategies it enables. Both the GP and the acquisition function are generalized to an acquisition functional over the simplex. To optimize this functional, two methods are proposed: one based on auto-differentiation and the other on a proximal-point algorithm and the gradient flow. Finally, we report a preliminary set of results on a class of problems whose dimension ranges from 5 to 100. These results show that embedding the optimization process in the simplex enables an effective algorithm whose performance over standard Bayesian optimization improves as the problem dimensionality increases.
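The construction the abstract describes can be illustrated with a minimal sketch: a 1-D Wasserstein distance between discrete distributions whose weights lie on the probability simplex, plugged into an exponentiated (RBF-style) GP kernel. This is an assumption-laden illustration, not the paper's actual implementation; the function names are hypothetical, the distributions are assumed to share a common sorted support, and a kernel of this form is not guaranteed positive definite in general.

```python
import numpy as np

def wasserstein_1d(p, q, support):
    # 1-Wasserstein distance between two discrete distributions p and q
    # on the same sorted 1-D support: the integral of |CDF_p - CDF_q|.
    cdf_diff = np.abs(np.cumsum(p) - np.cumsum(q))[:-1]
    gaps = np.diff(support)
    return float(np.sum(cdf_diff * gaps))

def wasserstein_rbf_kernel(p, q, support, lengthscale=1.0):
    # Illustrative exponentiated kernel built on the Wasserstein distance;
    # one plausible way to define a GP covariance over simplex points.
    w = wasserstein_1d(p, q, support)
    return float(np.exp(-w**2 / (2.0 * lengthscale**2)))
```

For example, moving all mass from location 0 to location 2 gives a distance of 2, and any distribution has kernel value 1 with itself.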


Similar articles

Bayesian optimization for computationally extensive probability distributions

An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribut...


Bayesian Simulation Optimization Using Augmented Probability Simulation

Under the subjective expected utility paradigm, decisions are made by finding the alternative with the maximum expected utility. In Bayesian simulation, the probability distribution used is the distribution of a simulation output. While methods have been developed under the Bayesian paradigm for choosing the best simulated system from a discrete, finite set of alternatives, the only methods for op...


Bayesian perspective over time

Thomas Bayes, the founder of the Bayesian vision, entered the University of Edinburgh in 1719 to study logic and theology. Returning in 1722, he worked with his father in a small church. He was also a mathematician, and in 1740 he made a novel discovery which he never published; his friend Richard Price found it in his notes after his death in 1761, re-edited it and published it. But until L...


Improving probability bounds by optimization over subsets

The simple device of maximization over subsets of events can provide substantial improvement over the Dawson–Sankoff degree two lower bound on the probability of a union of events and can also exceed a sharper bound that uses individual and pairwise joint event probabilities developed by Kuai, Alajaji, and Takahara. In each of their examples, the maximized bound achieves the exact probability o...
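The Dawson–Sankoff degree-two bound mentioned in this snippet can be sketched in a few lines, assuming its standard form: a lower bound on the probability of a union built only from the sum of individual probabilities (S1) and the sum of pairwise joint probabilities (S2), maximized over an integer parameter. The function name and example numbers here are illustrative, not taken from the cited paper.

```python
import math

def dawson_sankoff_bound(probs, pair_probs):
    # Degree-two lower bound on P(A1 ∪ ... ∪ An) using only
    # S1 = sum of individual event probabilities and
    # S2 = sum of pairwise joint probabilities.
    s1 = sum(probs)
    s2 = sum(pair_probs)
    # The maximizing integer parameter is k = 1 + floor(2*S2/S1).
    k = 1 + math.floor(2.0 * s2 / s1)
    return 2.0 * s1 / (k + 1) - 2.0 * s2 / (k * (k + 1))
```

For three events of probability 0.3 each with pairwise joint probabilities of 0.1 each, the bound evaluates to 0.6.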


On the complexity of optimization over the standard simplex

We review complexity results for minimizing polynomials over the standard simplex and unit hypercube. In addition, we derive new results on the computational complexity of approximating the minimum of some classes of functions (including Lipschitz continuous functions) on the standard simplex. The main tools used in the analysis are Bernstein approximation and Lagrange interpolation on the simp...
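A common device behind such approximation results is evaluating the function on the regular grid of points of the standard simplex whose coordinates are multiples of 1/k, and taking the best grid value. The sketch below illustrates that idea only; the helper names and the example objective are hypothetical, not the paper's algorithm.

```python
from itertools import product

def simplex_grid(n, k):
    # Yield all points of the regular grid on the standard simplex:
    # n coordinates, each a multiple of 1/k, summing to 1.
    for c in product(range(k + 1), repeat=n):
        if sum(c) == k:
            yield tuple(ci / k for ci in c)

def grid_minimum(f, n, k):
    # Approximate the minimum of f over the standard simplex
    # by evaluating f at every grid point.
    return min(f(x) for x in simplex_grid(n, k))
```

For instance, minimizing f(x) = -x[0]*x[1] over the 3-dimensional simplex with k = 4 returns -0.25, attained at the grid point (0.5, 0.5, 0).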



Journal

Journal title: Annals of Mathematics and Artificial Intelligence

Year: 2023

ISSN: 1573-7470, 1012-2443

DOI: https://doi.org/10.1007/s10472-023-09883-w